Search results for "Iterative refinement"

Showing 2 of 2 documents

A normalized iterative Smoothed Particle Hydrodynamics method

2020

Abstract In this paper we investigate a normalized iterative approach to improving the Smoothed Particle Hydrodynamics (SPH) estimate of a function. The method iterates on the residuals of an initial SPH approximation to obtain a more accurate solution. The iterative strategy preserves the matrix-free nature of the method, does not require changes to the kernel function, and is not affected by disordered data distributions. The iterative refinement is further improved by ensuring linear approximation order in the starting iterative values. We analyze the accuracy and convergence of the method with both the standard and the normalized formulation, giving evidence of the enhancements obtained wit…
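The iterated-residuals idea in this abstract can be illustrated compactly: compute a first SPH estimate, then repeatedly add the SPH estimate of the current residual. A minimal 1D sketch, assuming a cubic-spline kernel and uniformly spaced particles (function and variable names are hypothetical, not from the paper):

```python
import numpy as np

def cubic_spline_kernel(q, h):
    """1D cubic spline kernel, normalized so that its integral is 1."""
    sigma = 2.0 / (3.0 * h)
    return sigma * np.where(
        q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0),
    )

def sph_estimate(values, x, h, dx):
    """Standard SPH function estimate: kernel-weighted sum over particles."""
    q = np.abs(x[:, None] - x[None, :]) / h
    W = cubic_spline_kernel(q, h)
    return (W * values[None, :]).sum(axis=1) * dx

def iterative_sph(f, x, h, dx, iters=5):
    """Iterated-residual refinement: s <- s + SPH(f - s), repeated."""
    s = sph_estimate(f, x, h, dx)
    for _ in range(iters):
        r = f - s                          # residual of current approximation
        s = s + sph_estimate(r, x, h, dx)  # correct with SPH estimate of residual
    return s
```

Each pass reuses the same kernel summation, so the scheme stays matrix-free: no linear system is assembled, and convergence only requires the SPH smoothing operator to act as a contraction on the residual.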

Keywords: Numerical Analysis; General Computer Science; Applied Mathematics; Theoretical Computer Science; Modeling and Simulation; Smoothed-particle hydrodynamics; Iterated residuals; Normalized Smoothed Particle Hydrodynamics; Iterative refinement; Iterated function; Linear approximation; Accuracy; Convergence; Function (mathematics); Domain (mathematical analysis); Distribution (mathematics); Settore MAT/08 - Analisi Numerica; 01 natural sciences; 0101 mathematics; 010103 numerical & computational mathematics; 02 engineering and technology; 0202 electrical engineering, electronic engineering, information engineering; 020201 artificial intelligence & image processing

Feature Selection for Ensembles of Simple Bayesian Classifiers

2002

A popular method for creating an accurate classifier from a set of training data is to train several classifiers and then combine their predictions. Ensembles of simple Bayesian classifiers have traditionally not been a focus of research. However, the simple Bayesian classifier has much broader applicability than previously thought. Besides its high classification accuracy, it also has advantages in terms of simplicity, learning speed, classification speed, storage space, and incrementality. One way to generate an ensemble of simple Bayesian classifiers is to use different feature subsets, as in the random subspace method. In this paper we present a technique for building ensembles o…
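The ensemble construction described here, training one simple (naive) Bayesian classifier per random feature subset and combining them by majority vote, can be sketched as follows. This is a minimal illustration under stated assumptions (Gaussian class-conditional densities, integer class labels); the class and function names are hypothetical, not from the paper:

```python
import numpy as np

class SimpleGaussianNB:
    """Naive Bayes with Gaussian class-conditional densities per feature."""
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.mu = np.array([X[y == c].mean(axis=0) for c in self.classes])
        self.var = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes])
        self.log_prior = np.log([np.mean(y == c) for c in self.classes])
        return self

    def predict(self, X):
        # log-likelihood of each sample under each class, features independent
        ll = -0.5 * (((X[:, None, :] - self.mu) ** 2) / self.var
                     + np.log(2.0 * np.pi * self.var)).sum(axis=2)
        return self.classes[np.argmax(ll + self.log_prior, axis=1)]

def random_subspace_nb(X, y, n_members=7, subset_frac=0.5, seed=0):
    """Train one naive Bayes member per random feature subset."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    k = max(1, int(subset_frac * d))
    members = []
    for _ in range(n_members):
        feats = rng.choice(d, size=k, replace=False)
        members.append((feats, SimpleGaussianNB().fit(X[:, feats], y)))
    return members

def predict_ensemble(members, X):
    """Combine member predictions by simple majority vote."""
    votes = np.stack([m.predict(X[:, feats]) for feats, m in members]).astype(int)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
```

Each member sees only a fraction of the features, which decorrelates their errors; the vote then tends to be more accurate than a typical individual member, which is the rationale behind the random subspace method.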

Keywords: Training set; Computer science; Bayesian probability; Pattern recognition; Feature selection; Machine learning; Linear subspace; Random subspace method; Naive Bayes classifier; Iterative refinement; Artificial intelligence; Classifier (UML); Cascading classifiers